Mixture Density Networks
Not recorded
Abstract
Figure 7: Plot of the conditional probability densities p(t | x) of the target data, for various values of x, obtained by taking vertical slices through the contours in Figure 6, for x = 0.2, x = 0.5 and x = 0.8. It is clear that the Mixture Density Network is able to capture correctly the multimodal nature of the target data density function at intermediate values of x.

Figure 8: Plot of the priors αi(x) as a function of x for the 3 kernel functions from the same Mixture Density Network as was used to plot Figure 6. At both small and large values of x, where the conditional probability density of the target data is unimodal, only one of the kernels has a prior probability which differs significantly from zero. At intermediate values of x, where the conditional density is tri-modal, the three kernels have comparable priors.
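The captions above describe the core MDN idea: the network outputs input-dependent mixing priors αi(x), means and widths of a Gaussian mixture, and the conditional density p(t | x) is their weighted sum. A minimal NumPy sketch of just this output-layer computation (illustrative only; the function and variable names are our own, not from the original report):

```python
import numpy as np

def softmax(z):
    """Map raw network outputs to mixing priors that sum to one."""
    e = np.exp(z - z.max())
    return e / e.sum()

def mdn_density(t, alpha_logits, mu, sigma):
    """Conditional density p(t|x) as a mixture of 1-D Gaussian kernels.

    alpha_logits, mu, sigma would be produced by the network for a
    given input x; here they are supplied directly for illustration.
    """
    alpha = softmax(alpha_logits)                     # priors alpha_i(x)
    norm = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)       # Gaussian normalisers
    phi = norm * np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return float(np.sum(alpha * phi))

# Three kernels with comparable priors, as at intermediate x in Figure 8:
# the resulting p(t|x) is tri-modal, peaking near each kernel mean.
alpha_logits = np.array([0.0, 0.0, 0.0])
mu = np.array([0.2, 0.5, 0.8])
sigma = np.array([0.05, 0.05, 0.05])

print(mdn_density(0.5, alpha_logits, mu, sigma))   # high: at a mode
print(mdn_density(0.35, alpha_logits, mu, sigma))  # low: between modes
```

Setting one logit much larger than the others collapses the mixture toward a single kernel, mirroring the unimodal behaviour the captions describe at small and large x.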
Similar resources
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied to a Radial Basis Function Neural Network (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly minimize the error functions. The main idea concerns the various strategies to optimize the procedure of Gradient ...
Self-organizing mixture networks for probability density estimation
A self-organizing mixture network (SOMN) is derived for learning arbitrary density functions. The network minimizes the Kullback-Leibler information metric by means of stochastic approximation methods. The density functions are modeled as mixtures of parametric distributions. A mixture need not be homogeneous, i.e., it can have different density profiles. The first layer of the network is si...
On the equivalence between kernel self-organising maps and self-organising mixture density networks
The kernel method has become a useful trick and has been widely applied to various learning models to extend their nonlinear approximation and classification capabilities. Such extensions have also recently occurred to the Self-Organising Map (SOM). In this paper, two recently proposed kernel SOMs are reviewed, together with their link to an energy function. The Self-Organising Mixture Network ...
Time-dependent series variance learning with recurrent mixture density networks
This paper presents an improved nonlinear mixture density approach to modeling the time-dependent variance in time series. First, we elaborate a recurrent mixture density network for explicit modeling of the time conditional mixing coefficients, as well as the means and variances of its Gaussian mixture components. Second, we derive training equations with which all the network weights are infe...
Pulp Quality Modelling Using Bayesian Mixture Density Neural Networks
We model a part of a process in pulp-to-paper production using Bayesian mixture density networks. A set of parameters measuring paper quality is predicted from a set of process values. In most regression models, the response output is a real value, but in this mixture density model the output is an approximation of the density function for a response variable conditioned by an explanato...
Probability Estimation by Feed-forward Networks in Continuous Speech Recognition
We review the use of feed-forward networks as estimators of probability densities in hidden Markov modelling. In this paper we are mostly concerned with radial basis function (RBF) networks. We note the isomorphism of RBF networks to tied mixture density estimators; additionally, we note that RBF networks are trained to estimate posteriors rather than the likelihoods estimated by tied mixture d...